Feeling Happier When Paying More: Dysfunctional Counterfactual Thinking in Consumer Affect
In this research the authors examine whether counterfactual thinking, the process of imagining alternatives to reality, can have a detrimental impact on consumers’ feelings. Five studies examine the dysfunctional role of counterfactual thinking in the presence of Minimum Purchase Requirement conditional message framing (“X% off all purchases if you spend at least $Y”) and its affective consequences. Results show that the presence or absence of the minimum-amount restriction (Studies 1A and 1B), success or failure in meeting the restriction (Studies 2A and 2B), and perceived closeness (i.e., outcome proximity) to success or failure in meeting the restriction (Study 3) drastically influence consumer affect, to the extent that participants receiving an inferior deal exhibited higher satisfaction than those receiving a superior deal. It is suggested that such promotion-induced counterfactual thinking polarizes consumer satisfaction, which may impede consumers from arriving at optimal conclusions. © 2010 Wiley Periodicals, Inc.
Correlation Between Student Collaboration Network Centrality and Academic Performance
We compute nodal centrality measures on the collaboration networks of
students enrolled in three upper-division physics courses, usually taken
sequentially, at the Colorado School of Mines. These are complex networks in
which links between students indicate assistance with homework. The courses
included in the study are intermediate Classical Mechanics, introductory
Quantum Mechanics, and intermediate Electromagnetism. By correlating these
nodal centrality measures with students' scores on homework and exams, we find
four centrality measures that correlate significantly with students' homework
scores in all three courses: in-strength, out-strength, closeness centrality,
and harmonic centrality. These correlations suggest that students who not only
collaborate often, but also collaborate significantly with many different
people tend to achieve higher grades. Centrality measures between simultaneous
collaboration networks (analytical vs. numerical homework collaboration)
composed of the same students also correlate with each other, suggesting that
students' collaboration strategies remain relatively stable when presented with
homework assignments targeting different skills. Additionally, we correlate
centrality measures between collaboration networks from different courses and
find that the four centrality measures with the strongest relationship to
students' homework scores are also the most stable measures across networks
involving different courses. Correlations of centrality measures with exam
scores were generally smaller than the correlations with homework scores,
though this finding varied across courses.
Comment: 10 pages, 4 figures, submitted to Phys. Rev. PER
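As a minimal sketch of how the four highlighted measures can be computed and correlated with scores, the snippet below builds a small directed, weighted collaboration network with networkx. The edge convention, weights, and homework scores are illustrative assumptions, not the study's data.

```python
# Illustrative sketch only (not the authors' code). Edges are hypothetical:
# u -> v with weight w means student u reported w collaborations with v.
import networkx as nx
from scipy.stats import pearsonr

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("A", "B", 3), ("B", "A", 1), ("C", "B", 2),
    ("C", "A", 1), ("D", "C", 2), ("B", "D", 1),
])

# In-/out-strength are the weighted in- and out-degrees of each node.
in_strength = dict(G.in_degree(weight="weight"))
out_strength = dict(G.out_degree(weight="weight"))

# Closeness and harmonic centrality on the (unweighted) digraph.
closeness = nx.closeness_centrality(G)
harmonic = nx.harmonic_centrality(G)

# Hypothetical homework scores for the four students.
scores = {"A": 88, "B": 95, "C": 76, "D": 81}

nodes = sorted(G.nodes())
for name, measure in [("in-strength", in_strength),
                      ("out-strength", out_strength),
                      ("closeness", closeness),
                      ("harmonic", harmonic)]:
    r, p = pearsonr([measure[n] for n in nodes], [scores[n] for n in nodes])
    print(f"{name:>12}: r = {r:+.2f} (p = {p:.2f})")
```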
Quasiseparable Approach to Evaluating Cubic Splines
The development of fast and efficient algorithms is crucial not only for computer scientists but also for mathematicians and engineers, as such algorithms reduce computational complexity. Another common interest of these professionals is constructing models from existing data, which leads numerical analysts to explore interpolation techniques such as cubic spline interpolation. Here we propose a cubic spline solver that aims to bridge the gap between numerical linear algebra, electrical engineering, systems engineering, sensor processing, and parallel processing. We use quasiseparable structure to evaluate cubic splines by deriving a fast and stable algorithm. The derivation is carried out through a specific factorization of the inverse of a tridiagonal matrix, which yields an alternative to existing methods for solving tridiagonal systems. The proposed algorithm has a lower computational complexity than existing algorithms.
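For context, a natural cubic spline's second derivatives at the knots satisfy a symmetric tridiagonal linear system; the sketch below sets up and solves that standard system with a banded solver. This is the conventional baseline formulation, not the paper's quasiseparable factorization, and the knot data are made up.

```python
# Baseline sketch (not the paper's quasiseparable algorithm): the standard
# tridiagonal system for a natural cubic spline's second derivatives M_i.
import numpy as np
from scipy.linalg import solve_banded

x = np.array([0.0, 1.0, 2.5, 3.0, 4.0])   # knots (hypothetical data)
y = np.array([1.0, 2.0, 0.5, 1.5, 3.0])   # values at the knots

h = np.diff(x)                              # interval lengths h_i
n = len(x)

# Interior equations: h[i-1]*M[i-1] + 2*(h[i-1]+h[i])*M[i] + h[i]*M[i+1]
#                     = 6*((y[i+1]-y[i])/h[i] - (y[i]-y[i-1])/h[i-1])
ab = np.zeros((3, n))                       # banded storage for solve_banded
ab[1, 0] = ab[1, -1] = 1.0                  # natural BCs: M_0 = M_{n-1} = 0
for i in range(1, n - 1):
    ab[0, i + 1] = h[i]                     # superdiagonal
    ab[1, i] = 2.0 * (h[i - 1] + h[i])      # diagonal
    ab[2, i - 1] = h[i - 1]                 # subdiagonal

rhs = np.zeros(n)
rhs[1:-1] = 6.0 * ((y[2:] - y[1:-1]) / h[1:] - (y[1:-1] - y[:-2]) / h[:-1])

M = solve_banded((1, 1), ab, rhs)           # second derivatives at the knots
print(M)
```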
The Challenge of Time-Predictability in Modern Many-Core Architectures
Recent technological advances and market trends are driving a convergence of the High-Performance Computing (HPC) and Embedded Computing (EC) domains. Many recent HPC applications require huge amounts of information to be processed within a bounded amount of time, while EC systems are increasingly concerned with providing higher performance in real time. The convergence of these two domains towards systems requiring both high performance and predictable timing behavior challenges the capabilities of current hardware architectures. Fortunately, the advent of next-generation many-core embedded platforms offers the chance to meet this converging need for predictability and high performance, allowing HPC and EC applications to be executed on efficient and powerful heterogeneous architectures that integrate general-purpose processors with many-core computing fabrics. Addressing this mixed set of requirements is not without its challenges, however, and it is now of paramount importance to develop new techniques to exploit the massively parallel computation capabilities of many-core platforms in a predictable way.
Comparison of Quadratic- and Median-Based Roughness Penalties for Penalized-Likelihood Sinogram Restoration in Computed Tomography
We have compared the performance of two different penalty choices
for a penalized-likelihood sinogram-restoration strategy we have
been developing. One is a quadratic penalty we have employed
previously and the other is a new median-based penalty. We
compared the approaches to a noniterative adaptive filter that
loosely but not explicitly models data statistics. We found that
the two approaches produced similar resolution-variance tradeoffs
to each other and that they outperformed the adaptive filter in
the low-dose regime, which suggests that the particular choice of
penalty in our approach may be less important than the fact that
we are explicitly modeling data statistics at all. Since the
quadratic penalty allows for derivation of an algorithm that is
guaranteed to monotonically increase the penalized-likelihood
objective function, we find it to be preferable to the median-based penalty.
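As a rough illustration of the kind of objective being discussed, the sketch below evaluates a generic Poisson penalized likelihood (written as a negative log-likelihood to be minimized) under a quadratic neighbor-difference penalty and one plausible median-based penalty. These penalty forms, the weight beta, and the data are assumptions for illustration; the paper's exact definitions may differ.

```python
# Illustrative sketch only: generic quadratic and median-based roughness
# penalties on a 1-D signal, plus a Poisson penalized-likelihood objective.
import numpy as np

def quadratic_penalty(x):
    # Sum of squared differences between neighboring samples.
    return 0.5 * np.sum(np.diff(x) ** 2)

def median_penalty(x, w=3):
    # Squared deviation of each sample from its local median
    # (one common way to build a median-based roughness penalty).
    pad = w // 2
    xp = np.pad(x, pad, mode="edge")
    med = np.array([np.median(xp[i:i + w]) for i in range(len(x))])
    return 0.5 * np.sum((x - med) ** 2)

def penalized_negloglik(x, y, beta, penalty):
    # Poisson negative log-likelihood (up to a constant) plus beta * penalty;
    # minimizing this is equivalent to maximizing the penalized likelihood.
    return np.sum(x - y * np.log(x)) + beta * penalty(x)

y = np.random.default_rng(0).poisson(lam=50.0, size=64).astype(float)
x = np.clip(y, 1.0, None)  # crude starting estimate of the noiseless signal
print(penalized_negloglik(x, y, beta=0.1, penalty=quadratic_penalty))
print(penalized_negloglik(x, y, beta=0.1, penalty=median_penalty))
```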
Photography About Alzheimer's and Visual Perception Among Visitors to the Museo de Arte de Lima, Lima, 2020
This research project aimed to determine the relationship between photography about Alzheimer's disease and visual perception among visitors to the Museo de Arte de Lima, Lima, 2020. The methodology is applied in type, quantitative in approach, non-experimental in design, and cross-sectional in temporal scope. To this end, a photobook about Alzheimer's disease was produced for visitors to the MALI. The population was treated as infinite, with a sample of 68 men and women aged 18 to 29 years, and a 19-item questionnaire was developed whose items were answered on a Likert scale. The results show that a relationship does exist between photography about Alzheimer's and visual perception among visitors to the Museo de Arte de Lima, Lima, 2020: Pearson's chi-square test yielded a significance value of 0.000 < 0.05, so the research hypothesis is accepted and the null hypothesis is rejected.
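For readers unfamiliar with the decision rule reported above, the snippet below runs Pearson's chi-square test of independence on a hypothetical contingency table; the table values are invented and do not reproduce the study's data.

```python
# Minimal sketch of the reported decision rule on a made-up 2x2 table.
from scipy.stats import chi2_contingency

# Rows: levels of one variable (e.g., responses about the photobook);
# columns: levels of the other (e.g., visual-perception responses).
table = [[20, 5],
         [8, 35]]

chi2, p, dof, expected = chi2_contingency(table)
if p < 0.05:
    print(f"p = {p:.3f} < 0.05: reject the null hypothesis of independence")
else:
    print(f"p = {p:.3f} >= 0.05: fail to reject the null hypothesis")
```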
Quantitative Impact Evaluation of the WINNN Programme – Volume 1: Main Findings. Operations Research and Impact Evaluation
This report presents the results of the quantitative impact evaluation of the Working to Improve Nutrition in Northern Nigeria (WINNN) programme. The impact evaluation was conducted by the Operations Research and Impact Evaluation (ORIE) project. ORIE is responsible for undertaking operations research and assessing the impact of the WINNN programme; it is led by Oxford Policy Management (OPM) and implemented in collaboration with other institutions.
UK Department for International Development